Application of integral operator for regularized least-square regression

Authors

  • Hongwei Sun
  • Qiang Wu
Abstract

In this paper, we study the consistency of regularized least square regression in general reproducing kernel Hilbert spaces. We characterize the compactness of the inclusion map from a reproducing kernel Hilbert space into the space of continuous functions and show that the capacity-based analysis via uniform covering numbers may fail in this very general setting. We prove consistency and compute the learning rate by means of the integral operator; to this end, we also study the properties of the integral operator. The analysis reveals that the essence of this approach is the isomorphism property of the square root operator.
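The paper's object of study is the regularized least squares estimator over an RKHS. As a concrete, hedged illustration, here is a minimal sketch of that scheme for a Gaussian kernel; the kernel choice, the function names (rls_fit, rls_predict), and the lam * m scaling of the regularizer are assumptions for the example, not details from the paper. By the representer theorem, the minimizer of the regularized empirical risk has the form f(x) = sum_i alpha_i K(x, x_i), where alpha solves an m x m linear system.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rls_fit(X, y, lam, sigma=1.0):
    """Regularized least squares in the RKHS of the Gaussian kernel:
    minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    The representer theorem reduces this to solving
    (K + lam * m * I) alpha = y for the coefficient vector alpha.
    """
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def rls_predict(X_train, alpha, X_test, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i K(x, x_i) at the test points."""
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# Usage on synthetic one-dimensional data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = rls_fit(X, y, lam=1e-3)
X_test = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
print(rls_predict(X, alpha, X_test))
```

The consistency question the abstract raises is how this estimator approaches the regression function as the sample size m grows; the integral operator technique is the tool the paper uses to answer it.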


Similar articles

Regularized Least Square Regression with Spherical Polynomial Kernels

This article considers regularized least square regression on the sphere. It develops a theoretical analysis of the generalization performance of the regularized least square regression algorithm with spherical polynomial kernels. Explicit bounds are derived for the excess risk. The learning rates depend on the eigenvalues of the spherical polynomial integral operators and on the dimension o...
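As a hedged illustration of the kernel family named here, one common polynomial kernel on the unit sphere is K(x, z) = (1 + <x, z>)^d for unit vectors x and z; the exact family the article analyzes may differ. A sketch:

```python
import numpy as np

def spherical_poly_kernel(X, Z, degree=3):
    """Polynomial kernel K(x, z) = (1 + <x, z>)^degree evaluated after
    projecting the rows of X and Z onto the unit sphere (an assumed
    form; the article's kernel family may differ)."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    return (1.0 + X @ Z.T) ** degree

# A polynomial kernel of fixed degree has a finite-dimensional RKHS,
# so the Gram spectrum that drives the learning rates above has only
# finitely many nonzero eigenvalues.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
print(np.linalg.eigvalsh(spherical_poly_kernel(X, X))[-5:])
```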


Convergence Rate of Coefficient Regularized Kernel-based Learning Algorithms

We investigate machine learning for least square regression with data-dependent hypothesis spaces and coefficient regularization algorithms based on general kernels. We provide estimates of the learning rates for both regression and classification when the hypothesis spaces are sample dependent. Under a weak condition on the kernels we derive the learning error by estimating the rate of some K-f...
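For contrast with the RKHS-norm penalty above, the following is a minimal sketch of an l2 coefficient-regularization scheme of the kind this abstract studies; the function name and the lam * m scaling are illustrative assumptions. The penalty falls on the coefficient vector itself rather than on the RKHS norm, and the hypothesis space f(x) = sum_j alpha_j K(x, x_j) is built from the sample, which is what "data-dependent hypothesis" refers to.

```python
import numpy as np

def coef_regularized_fit(K, y, lam):
    """l2 coefficient regularization over the sample-dependent space
    f(x) = sum_j alpha_j K(x, x_j):

        min_alpha (1/m) * ||K @ alpha - y||^2 + lam * ||alpha||^2

    Setting the gradient to zero yields the normal equations
        (K.T @ K + lam * m * I) @ alpha = K.T @ y.
    Note the penalty is ||alpha||^2, not the RKHS norm alpha.T K alpha
    used by standard kernel ridge regression.
    """
    m = len(y)
    return np.linalg.solve(K.T @ K + lam * m * np.eye(m), K.T @ y)
```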


Reproducing Kernel Hilbert Spaces in Learning Theory: the Sphere and the Hypercube

We analyze the regularized least square algorithm in learning theory with Reproducing Kernel Hilbert Spaces (RKHS). Explicit convergence rates for the regression and binary classification problems are obtained in particular for the polynomial and Gaussian kernels on the n-dimensional sphere and the hypercube. There are two major ingredients in our approach: (i) a law of large numbers for Hilber...


Optimal Rates for Regularized Least Squares Regression

We establish a new oracle inequality for kernel-based, regularized least squares regression methods, which uses the eigenvalues of the associated integral operator as a complexity measure. We then use this oracle inequality to derive learning rates for these methods. Here, it turns out that these rates are independent of the exponent of the regularization term. Finally, we show that our learning...
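The complexity measure in this oracle inequality is the eigenvalue sequence of the integral operator (L_K f)(x) = \int K(x, t) f(t) dρ(t). Those eigenvalues are not directly observable, but a standard empirical surrogate, sketched below under the assumption that a Gram matrix K on the sample is available, is the spectrum of the normalized kernel matrix (1/m) K.

```python
import numpy as np

def empirical_operator_spectrum(K):
    """Eigenvalues of the normalized Gram matrix (1/m) * K in
    descending order. For a Mercer kernel these approximate the
    eigenvalues of the integral operator L_K as the sample grows."""
    m = K.shape[0]
    return np.sort(np.linalg.eigvalsh(K / m))[::-1]

# Example: the fast spectral decay of a Gaussian kernel on [0, 1],
# the kind of decay that yields good rates in eigenvalue-based bounds.
x = np.linspace(0.0, 1.0, 200)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * 0.1 ** 2))
print(empirical_operator_spectrum(K)[:10])
```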


A Sharp Maximal Function Estimate for Vector-Valued Multilinear Singular Integral Operator

We establish a sharp maximal function estimate for some vector-valued multilinear singular integral operators. As an application, we obtain the $(L^p, L^q)$-norm inequality for vector-valued multilinear operators.



Journal title:
  • Mathematical and Computer Modelling

Volume 49, Issue -

Pages -

Publication date: 2009